Rényi entropy
In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min-entropy. Entropies quantify the diversity, uncertainty, or randomness of a system. The Rényi entropy is named after Alfréd Rényi.
The Rényi entropy is important in ecology and statistics as an index of diversity. It is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of ''α'' can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors.
== Definition ==
The Rényi entropy of order \alpha, where \alpha \geq 0 and \alpha \neq 1, is defined as
:H_\alpha(X) = \frac{1}{1-\alpha}\log\Bigg(\sum_{i=1}^n p_i^\alpha\Bigg) .
Here, X is a discrete random variable with possible outcomes 1, 2, \dots, n and corresponding probabilities p_i \doteq \Pr(X=i) for i=1,\dots,n, and the logarithm is base 2.
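To make the definition concrete, here is a minimal Python sketch (the helper name renyi_entropy and the use of NumPy are illustrative assumptions, not part of the original article):

 import numpy as np
 def renyi_entropy(p, alpha):
     """Renyi entropy of order alpha in bits (base-2 logarithm).
     p: probability vector with strictly positive entries summing to 1
     (drop zero-probability outcomes first; this matters at alpha = 0,
     the Hartley entropy). Requires alpha >= 0 and alpha != 1."""
     p = np.asarray(p, dtype=float)
     # H_alpha(X) = 1/(1 - alpha) * log2( sum_i p_i ** alpha )
     return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

In the limit \alpha \to 1 this expression converges to the Shannon entropy, which is why the case \alpha = 1 is excluded from the direct formula.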
If the probabilities are p_i=1/n for all i=1,\dots,n, then all the Rényi entropies of the distribution are equal: H_\alpha(X)=\log n.
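For example, checking the uniform case numerically with the sketch above for n = 4:

 p_uniform = [0.25, 0.25, 0.25, 0.25]
 for a in (0.5, 2.0, 5.0):
     print(renyi_entropy(p_uniform, a))  # prints 2.0 each time, i.e. log2(4)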
In general, for all discrete random variables X, H_\alpha(X) is a non-increasing function in \alpha.
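This monotonicity can be observed with the same hypothetical helper on an arbitrary skewed distribution:

 p = [0.7, 0.2, 0.1]
 values = [renyi_entropy(p, a) for a in (0.25, 0.5, 2.0, 8.0)]
 # roughly 1.47 > 1.36 > 0.89 > 0.59 bits: non-increasing in alpha
 assert all(x >= y for x, y in zip(values, values[1:]))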
Applications often exploit the following relation between the Rényi entropy and the ''p''-norm of the vector of probabilities:
:H_\alpha(X)=\frac{\alpha}{1-\alpha} \log \left(\|P\|_\alpha\right) .
Here, the discrete probability distribution P=(p_1,\dots,p_n) is interpreted as a vector in \R^n with p_i\geq 0 and \sum_{i=1}^{n} p_i = 1.
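A quick sketch can confirm that this norm-based form agrees with the direct definition (again reusing the hypothetical renyi_entropy from above):

 def renyi_entropy_via_norm(p, alpha):
     # H_alpha(X) = alpha/(1 - alpha) * log2(||P||_alpha), where
     # ||P||_alpha = (sum_i p_i ** alpha) ** (1 / alpha)
     p_norm = np.sum(np.asarray(p, dtype=float) ** alpha) ** (1.0 / alpha)
     return (alpha / (1.0 - alpha)) * np.log2(p_norm)
 p = [0.5, 0.3, 0.2]
 assert abs(renyi_entropy(p, 3.0) - renyi_entropy_via_norm(p, 3.0)) < 1e-12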
The Rényi entropy for any \alpha \geq 0 is Schur concave.
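Schur concavity means that if P majorizes Q (the sorted partial sums of P dominate those of Q), then H_\alpha(P) \leq H_\alpha(Q). A small numerical illustration with two arbitrary example distributions:

 # p_major majorizes q_minor: 0.6 >= 0.5 and 0.6 + 0.3 >= 0.5 + 0.3
 p_major = [0.6, 0.3, 0.1]
 q_minor = [0.5, 0.3, 0.2]
 for a in (0.5, 2.0, 4.0):
     assert renyi_entropy(p_major, a) <= renyi_entropy(q_minor, a)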

Excerpt source: Wikipedia, the free encyclopedia. Read the full "Rényi entropy" article on Wikipedia.